Introduction

The purpose of this project is to develop a model that will predict the popularity of Spotify songs.

What is Spotify?

Spotify is an audio streaming service founded on April 23, 2006, in Stockholm, Sweden, by Daniel Ek and Martin Lorentzon. The service offers both music and podcasts, with music being its main use. Spotify hosts millions of songs and provides both a free and a premium option for its subscribers.

knitr::include_graphics("images/spotifylogo.png")

Why are we doing this?

Music brings the world together, but have you ever wondered what makes some of the songs you love so great? The research question I am proposing is “What characteristics in a song can be used to determine how popular it can get?” The main motive is to find key factors that affect virality and trends that could be followed. As an avid Spotify user, I catch myself playing certain songs on repeat and knowing all the words to some of the greatest hits. It would be interesting to see if there are patterns among these songs and whether they share overlapping attributes. By creating a model, Spotify users can discover common patterns in the top Spotify songs and what makes them great hits, and artists can gain insight into what to produce in hopes of making a popular song.

knitr::include_graphics("images/taylorswift.png")

Data Description

The data set being used is from Kaggle; it is called “Top Spotify Songs From 2010-2019 By Year” and was created by Leonardo Henrique. With this data, I can analyze popular songs and create a model that finds potential trends.

Project Outline

Let’s discuss the plan for how to tackle this project. Fortunately, this data has been cleaned and there are no missing values. First, we will load the data and then perform exploratory data analysis. We will use all the predictors given and have pop be the response variable. pop indicates how popular the song is, with 0 being not popular and 100 being the most popular; since it takes numeric values, this is a regression problem. Linear Regression, Ridge Regression, K-Nearest Neighbors, Elastic Net, Random Forest, Support Vector Machine, and Boosted Trees will be used as models on the training data. From there, we will use the models on the testing data to see how well they actually did.

Exploratory Data Analysis

Before we start building models, we will begin with a general analysis of our data and figure out what we are working with.

# Loading all necessary packages
library(readr)
library(janitor)
library(dplyr)
library(tidymodels)
library(kknn)
library(glmnet)
library(corrr)
library(corrplot)
library(tidyverse)
library(finalfit)
library(kableExtra)
library(kernlab)
library(xgboost)

Loading the Data

# Loading the data
spotify <- read_csv("top10s.csv", show_col_types = FALSE)

# Cleaning predictor names
spotify <- clean_names(spotify)
set.seed(100)

# Seeing first few rows of data set
head(spotify)
dim(spotify)
## [1] 603  15

There are 603 observations and 15 variables, which is a reasonable sample size for the number of predictors we plan to use. As for our predictors, we will use bpm, nrgy, dnce, d_b, live, val, dur, acous, and spch. pop will be our response.

Missing and Tidying Data

# Missing values (none)
spotify %>%
  missing_plot()

Luckily, there are no missing observations in our data set, so we can use all the predictors for our model! As for tidying the data, the data set is already tidy, and all components can and will be used.
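For readers without the finalfit package, the same check can be done in base R by counting NA values per column. This is a sketch on a tiny stand-in frame, not the real data; on the loaded data set the call would be colSums(is.na(spotify)).

```r
# Package-free equivalent of the missing-data check: count NAs per column.
# `toy` is an illustrative stand-in for the real data frame.
toy <- data.frame(bpm = c(120, 97, 140), pop = c(70, 65, 80))
colSums(is.na(toy))  # all zeros when nothing is missing
```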

Describing the Variables

After looking at all the possible predictors, I have decided to keep all the numeric variables since they provide good insight into the response variable, pop. I decided not to use top_genre as a predictor because the majority of the songs fall under some category of pop (this will be shown in the Visual EDA portion). So, here are the variables that will be used:

  • bpm: Beats Per Minute - the tempo of the song
  • nrgy: Energy - the energy of the song, the higher the value, the more energetic the song
  • dnce: Danceability - the higher the value, the easier it is to dance to the song
  • d_b: Loudness - the higher the value, the louder the song
  • live: Liveness - the higher the value, the more likely the song is a live recording
  • val: Valence - the higher the value, the more positive mood for the song
  • dur: Length - the duration of the song
  • acous: Acousticness - the higher the value, the more acoustic the song is
  • spch: Speechiness - the higher the value, the more spoken word the song contains
  • pop: Popularity - the higher the value, the more popular the song is

Visual EDA

Variable Correlation Plot

Let’s check out a correlation heat map to see if there is a relationship between the variables.

spotify_numeric <- spotify %>%
  select_if(is.numeric) %>%
  # Dropping the row index and release year; neither is a song attribute
  select(-x1, -year)
spotify_cor <- cor(spotify_numeric)
spotify_cor_plt <- corrplot(spotify_cor, col = COL2("BrBG"))

When looking at the correlation plot, I expected more correlation between the variables. Still, both the stronger correlations and the weaker ones make sense. As shown above, there is a positive correlation between d_b and nrgy, which makes sense because louder music typically also means higher energy. In addition, there is a correlation between val and dnce. This is expected because songs people dance to are typically more positive and upbeat. On the other hand, there is a negative correlation between nrgy and acous, which is also expected because acoustic songs are much more mellow and tend to lack energy.
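To build intuition for what the heat map encodes, here is a small base-R sketch using synthetic stand-in vectors (not the real Spotify columns): one pair constructed to correlate positively, like d_b and nrgy, and one negatively, like nrgy and acous.

```r
set.seed(1)
# Synthetic stand-ins: louder tracks tend to be more energetic,
# and more energetic tracks tend to be less acoustic.
loudness <- runif(100, -10, 0)                      # like d_b, in decibels
energy   <- 60 + 4 * loudness + rnorm(100, sd = 3)  # like nrgy
acous    <- 80 - 0.7 * energy + rnorm(100, sd = 5)  # like acous
cor(loudness, energy)  # strongly positive
cor(energy, acous)     # clearly negative
```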

Popularity of Spotify Songs Distribution

spotify %>% 
  ggplot(aes(x = pop)) +
  geom_bar(fill = "darkseagreen") + 
  labs(x = "Popularity", y = "# of Songs", title = "Distribution of Popularity of Spotify Songs")

For this data, we can see that the distribution of popularity for these Spotify songs is left-skewed: most of the songs are very popular, with a smaller tail of low scores. According to the graph, most of these songs lie within the 60-80 range.
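A quick sanity check on the skew direction: when most values sit high with a tail of low scores, the mean is pulled below the median, which is the signature of a left-skewed distribution. The toy popularity scores below are made up for illustration.

```r
# Toy popularity scores: a cluster of hits plus a couple of flops in the left tail.
pop_sample <- c(68, 70, 70, 71, 72, 75, 78, 80, 25, 10)
mean(pop_sample)    # 61.9, pulled down by the low tail
median(pop_sample)  # 70.5, stays near the bulk of the scores
```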

Count of the Genres of Spotify Songs

# Grouping genres into the most popular 5 and an "other" level
spotify$top_genre <- factor(spotify$top_genre)
spotify$top_genre <- fct_lump(spotify$top_genre, n = 5, other_level = "other")
spotify %>% 
  ggplot(aes(y = reorder(top_genre, top_genre, length))) +
  geom_bar(fill = "darkseagreen4") +
  labs(title = "Count of the Top Genres of Spotify Songs", y = "Genre") 

This plot shows that in our data set of top Spotify songs, an overwhelming number of songs fall under the dance pop genre. The genre (besides other) with the next most popular songs is pop. It is evident that the songs in this data set mostly fall into some variation of the pop genre (e.g., electropop, Canadian pop). Pop songs have catchy rhythms and lyrics that invite people to sing along; their choruses are repetitive, easy to remember, and easy to listen to.
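fct_lump() comes from the forcats package; a minimal base-R emulation of the same idea, shown on illustrative genre labels rather than the full Kaggle genre list, looks like this:

```r
# Keep the n most frequent levels and collapse everything else into "other",
# mimicking forcats::fct_lump(). Genre labels here are illustrative.
lump_levels <- function(x, n, other = "other") {
  keep <- names(sort(table(x), decreasing = TRUE))[seq_len(min(n, length(unique(x))))]
  factor(ifelse(x %in% keep, as.character(x), other))
}

genres <- c(rep("dance pop", 6), rep("pop", 3), "boy band", "electropop", "latin")
table(lump_levels(genres, n = 2))  # dance pop: 6, pop: 3, other: 3
```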

Setting Up Models

It’s time to set up our models! Let’s finally see if we are able to predict a song’s popularity based on these components. First, we will split the data, create a recipe, and then create folds for k-fold cross validation.

Splitting the Data

To begin, we will split the data: 70% for training and 30% for testing. We will stratify on the pop variable so that the distribution of popularity is similar in both sets. The purpose of having a training set is pretty intuitive: it will be used to train our model. The testing data will then be used to test how well our model did by introducing data that our model has not been exposed to before.
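initial_split() handles the stratification for us; as a rough base-R sketch of the idea (synthetic popularity values, an arbitrary choice of four bins), stratified sampling means taking roughly 70% of the rows within each popularity bin:

```r
# Sketch of a stratified 70/30 split in base R; initial_split(strata = "pop")
# does something similar internally by binning the numeric response.
set.seed(100)
pop  <- sample(20:90, 200, replace = TRUE)  # synthetic popularity scores
bins <- cut(pop, breaks = 4)                # bin the numeric response
train_idx <- unlist(lapply(split(seq_along(pop), bins), function(i) {
  i[sample.int(length(i), floor(0.7 * length(i)))]  # ~70% within each bin
}))
length(train_idx) / length(pop)             # close to 0.7
```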

set.seed(100)

# Splitting the data (70% for training and 30% for testing), stratifying against the popularity
spotify_split <- spotify %>%
  initial_split(prop = 0.7, strata = "pop")

# Training Data
spotify_train <- training(spotify_split)

# Testing Data
spotify_test <- testing(spotify_split)

Verifying the split:

nrow(spotify_train)/nrow(spotify)
## [1] 0.6965174
nrow(spotify_test)/nrow(spotify)
## [1] 0.3034826

This confirms that approximately 70% of our data is in the training set and approximately 30% is in the testing set.

Recipe Building

Since we will be using the same predictors and response variable in all our models, we will create a general recipe to use. We can think of the recipe as the song and the different components in the song as the ingredients: the guitar, drums, singer, producer, etc. Each model is then a remix of the song, since we are still using the same recipe/song but applying it differently. We will exclude year since it is not especially relevant to what we are analyzing, and we will exclude the categorical variable top_genre because most of the songs are some variation of pop, so genre would not play a significant role in predicting the popularity of a song. Other than that, all of our predictors will be used in creating our recipe.

spotify_recipe <- recipe(pop ~ bpm + nrgy + dnce + d_b + live + val + dur + acous + spch, data = spotify_train) %>%
  # Normalizing (this single step both centers and scales all numeric predictors)
  step_normalize(all_numeric_predictors())
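As a quick illustration of what the normalization step does to each predictor, shown on a made-up bpm vector: subtract the column mean and divide by the standard deviation, so every predictor ends up on a comparable scale.

```r
# scale() centers and scales by default, mirroring what the recipe
# does to each numeric predictor. The bpm values are made up.
bpm <- c(80, 100, 120, 128, 150, 190)
bpm_scaled <- as.numeric(scale(bpm))
mean(bpm_scaled)  # ~0 after centering
sd(bpm_scaled)    # 1 after scaling
```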

prep(spotify_recipe) %>%
  bake(new_data = spotify_train) %>%
  kable() %>%
  kable_styling(full_width = F) %>%
  scroll_box(width = "100%", height = "200px")
bpm nrgy dnce d_b live val dur acous spch pop
0.2949367 1.2582683 0.5769283 -0.2889031 1.8633145 1.3997249 1.2366204 -0.6055632 0.2300409 59
1.3646551 0.6324865 -1.1079066 0.2861647 -0.4516048 -1.1304991 0.7837545 -0.5573386 0.0960365 58
1.3646551 0.8827992 -1.0313232 0.8612326 -0.4516048 0.8762303 -1.1409255 1.5645438 2.5081158 58
-0.6036269 -1.8706404 0.5003449 -0.8639710 -0.6009545 -1.8284920 1.3498369 -0.5091140 -0.7079900 57
-0.2613169 1.0079556 1.1895955 0.8612326 -0.6756293 1.4433495 -0.9144926 0.2142550 -0.1719723 56
(remaining rows of the baked training data omitted; the full table appears in a scroll box in the rendered report)
1.1935002 1.3834246 -0.1123223 0.8612326 -0.6009545 1.5742231 0.9252751 -0.5573386 -0.0379679 70
0.7656128 0.3195957 0.1940113 0.8612326 1.1912411 0.0909884 0.1327598 -0.5091140 1.0340673 70
0.2949367 -0.1810297 0.1174279 -0.8639710 1.4152656 -1.1741237 0.7554504 0.5036026 -0.5739855 70
-0.6464156 0.6324865 0.1174279 0.2861647 -0.1529056 0.4836093 -0.8861884 -0.6055632 0.9000629 75
0.3805141 0.0692830 0.2705947 -0.2889031 0.6685174 0.2218620 -0.3201061 -0.3162156 -0.5739855 73
-0.7747818 -0.8068114 0.3471781 -0.8639710 0.9672167 -0.4325063 -0.9144926 0.1178058 0.0960365 73
0.0809930 -0.3061861 -0.4952393 0.2861647 -0.5262797 -1.3922464 1.5196616 -0.1715418 -0.7079900 72
-1.3310354 0.4447520 -0.6484062 0.2861647 -0.5262797 -0.8251272 0.1610639 0.7929502 -0.5739855 71
1.7925425 1.5085810 -1.1844900 0.8612326 1.1165663 -0.3452572 -0.6597555 -0.7020124 0.6320541 71
-0.9887255 0.0692830 2.0320130 0.8612326 -0.9743286 -0.6070045 -0.7163637 -0.2679910 -0.1719723 70
-0.8175705 0.0067048 -1.3376568 0.8612326 -0.7503041 -1.2177482 0.3025845 -0.6537878 -0.7079900 76
3.0762047 0.1944393 -2.7927415 -0.2889031 3.0581116 -0.8251272 0.1610639 -0.7020124 0.0960365 76
0.4233029 0.9453774 -1.7971572 0.8612326 1.1912411 -0.7378781 -0.3484102 -0.5573386 -0.0379679 76
-0.7747818 0.3821738 1.8022628 0.8612326 -0.4516048 0.3091111 -0.0087608 -0.4126648 0.0960365 75
0.5088804 -0.0558734 0.3471781 0.2861647 0.1457937 -1.0432500 -0.2918020 -0.6537878 -0.4399811 75
-0.9887255 -0.6190769 1.2661790 -0.8639710 3.4314856 1.4433495 -0.1502814 -0.6537878 -0.1719723 74
0.5088804 0.5073302 -1.7205738 0.2861647 -0.2275804 -0.0835099 0.8969710 -0.6537878 -0.4399811 73
0.1237817 -0.7442333 1.1895955 -0.8639710 0.0711189 0.1346129 0.0761516 0.0695812 -0.5739855 73
1.7069650 -1.6203277 -1.2610734 -0.2889031 -0.6756293 -0.9996255 1.9725274 0.9376240 -0.7079900 73
-1.6733453 0.4447520 0.1174279 0.2861647 1.0418915 1.3561004 -1.1692296 1.1305224 1.4360806 72
0.7656128 0.0067048 0.8066785 0.2861647 -0.3769300 0.1782375 -0.3201061 -0.6537878 0.6320541 72
-1.3738241 0.1318611 -0.3420725 0.8612326 1.1165663 0.0473638 -0.9144926 0.4553780 0.9000629 72
2.4771623 0.5699084 -0.1123223 -0.8639710 -0.3022552 1.7050968 -0.2068896 0.3107042 1.0340673 72
-0.9887255 0.5699084 -0.4186559 0.2861647 -0.7503041 -0.3888817 -0.4899308 -0.0750926 -0.3059767 72
0.0809930 -0.2436079 1.3427624 0.8612326 -0.8996538 1.8359704 0.2742804 0.7447256 -0.4399811 71
-1.0743030 0.0067048 0.2705947 0.8612326 1.9379893 -0.3016326 -0.0936732 -0.6537878 -0.0379679 70
0.0809930 0.5699084 -0.2654891 -1.4390389 0.5938426 0.0037392 -0.2068896 -0.6537878 -0.4399811 76
-0.0045845 -0.3687642 1.1130121 -1.4390389 -0.6009545 -0.5633799 1.0667957 -0.4608894 0.2300409 74
-0.7747818 0.5699084 0.5003449 -0.8639710 0.9672167 -0.7815027 0.0478474 -0.5091140 -0.3059767 74
-1.0743030 0.5699084 -0.2654891 0.2861647 -0.1529056 -0.0835099 -1.1409255 -0.3644402 -0.5739855 73
0.2521479 -0.1184515 2.1851798 -0.2889031 2.2366886 1.3997249 -1.2541420 -0.2197664 0.2300409 73
-0.4324719 1.0705337 0.8066785 0.8612326 0.3698181 1.0071040 -1.6787037 -0.5091140 -0.1719723 73
0.2521479 -0.5564988 0.0408445 0.2861647 -0.3022552 -0.5197554 0.2459763 0.5036026 -0.0379679 72
-1.2026692 -0.7442333 0.1174279 -2.0141068 -0.4516048 -0.9560009 -0.8861884 -0.2197664 -0.7079900 72
-0.3041057 -1.3074368 1.2661790 -0.8639710 1.1912411 -0.2143835 -0.6880596 -0.6537878 0.4980497 71
1.9209087 -0.0558734 -0.1123223 -0.2889031 -0.5262797 -0.0835099 -0.2918020 -0.6537878 0.9000629 71
-1.1170917 0.5699084 -0.6484062 0.8612326 4.1782338 0.5708584 -0.3201061 -0.6537878 1.8380938 70
0.0809930 1.0705337 0.8832619 0.2861647 -0.6009545 -0.5197554 -0.2068896 -0.6537878 -0.5739855 70
3.1617822 0.1318611 0.1940113 -0.2889031 -0.8996538 0.9198548 -0.4333226 -0.6537878 2.5081158 76
-0.5180494 -0.4313424 -0.2654891 -1.4390389 -0.4516048 -1.6976183 -0.0936732 -0.1233172 -0.3059767 76
-0.7747818 -0.4313424 0.0408445 -0.8639710 -0.6756293 -1.1741237 -0.0936732 -0.6055632 -0.7079900 76
-1.4166129 -0.5564988 -2.4098245 -0.2889031 -0.1529056 0.3527357 -0.2634979 0.8893994 3.9821643 75
-0.6036269 0.5073302 0.7300951 0.8612326 -0.9743286 0.2218620 -1.1692296 -0.2197664 -0.5739855 75
-0.7747818 0.0692830 0.3471781 -0.2889031 -0.9743286 -0.4761308 0.2459763 -0.0268680 0.3640453 72
-0.8175705 -2.1209531 -0.8781564 -3.1642426 -0.5262797 -1.2613728 1.3498369 1.6609930 -0.7079900 72
-1.0315142 1.2582683 -0.8781564 1.4363005 -0.6009545 0.4399848 -1.0277090 0.0213566 0.0960365 71
-1.2026692 -0.7442333 -0.1123223 -0.8639710 -0.6009545 0.7017321 -0.8861884 0.2624796 0.3640453 71
-0.7747818 -1.2448587 0.4237615 0.2861647 -0.6756293 0.0037392 -0.4050184 -0.3162156 -0.7079900 71
0.2949367 0.0692830 2.1851798 -0.2889031 1.2659159 -0.2580081 -0.0370649 -0.3644402 0.3640453 70
-0.5608381 1.0079556 -1.8737406 0.8612326 1.3405908 -0.2143835 -0.6880596 0.1178058 3.0441335 70
0.2521479 1.0705337 0.3471781 0.8612326 -0.2275804 -0.1271344 -0.9427967 -0.5091140 -0.3059767 70
0.2949367 1.0079556 0.5003449 1.4363005 -0.4516048 0.8762303 -0.6314514 -0.5573386 -0.5739855 70
-0.8603593 -0.2436079 0.1940113 1.4363005 -0.4516048 0.4836093 -0.6031473 -0.5573386 -0.1719723 76
-0.6464156 -1.5577495 0.4237615 -1.4390389 -0.9743286 -0.1707590 -0.5748432 -0.7020124 -0.0379679 76
-0.7747818 0.6324865 0.1174279 -0.2889031 -0.9743286 -0.2580081 -0.1785855 -0.5573386 -0.0379679 76
-1.2026692 -0.8068114 0.1174279 -1.4390389 -0.4516048 -0.3016326 -0.9144926 -0.1715418 -0.4399811 76
0.4233029 0.0067048 0.9598453 -0.2889031 -0.3769300 -0.0398853 -0.3767143 0.2624796 0.4980497 75
-0.4752606 0.5699084 -0.1123223 0.8612326 -0.6009545 -0.8687518 -0.2918020 -0.5091140 -0.3059767 74
-0.9031480 -0.4939206 0.5003449 -0.8639710 -0.7503041 -0.6506290 1.5196616 -0.2197664 -0.0379679 74
0.0809930 -0.8693896 -0.9547398 0.2861647 -0.5262797 -1.3486219 3.2745169 -0.5573386 -0.7079900 72
-1.0743030 0.6324865 -0.0357389 0.2861647 -0.7503041 1.7050968 -0.8012761 -0.6055632 3.3121423 72
2.1348524 -0.8693896 -1.4142402 -0.2889031 -0.6009545 -1.4358710 -1.1409255 0.4553780 -0.4399811 72
-1.2026692 -0.7442333 -0.1123223 -0.8639710 -0.6009545 0.7017321 -0.8861884 0.2624796 0.3640453 71
-1.0315142 1.0079556 0.7300951 1.4363005 -1.0490034 0.3963602 -1.2541420 0.1178058 -0.4399811 76
-0.6036269 -0.3061861 -0.2654891 -0.8639710 0.1457937 -1.5667446 -1.3956626 -0.6537878 -0.7079900 75
-0.9887255 0.5073302 0.8066785 -0.2889031 -0.8249789 0.3963602 -0.5465390 0.3107042 0.4980497 75
0.7656128 0.3195957 -0.8781564 0.2861647 -0.6756293 0.5708584 0.9818833 -0.3644402 3.4461467 70
-0.9031480 1.1331119 0.1940113 0.8612326 -0.7503041 1.2252267 -0.2351937 0.2142550 -0.5739855 83
0.0382042 1.3208464 0.4237615 0.8612326 -0.7503041 0.8326057 1.9725274 -0.7020124 -0.5739855 79
-0.3896832 0.8202211 -0.0357389 0.2861647 -0.6756293 -0.3888817 -0.1219773 -0.6055632 -0.5739855 78
-2.2723877 0.9453774 0.6535117 0.2861647 -0.5262797 0.0909884 -0.3201061 -0.5091140 0.7660585 77
0.0809930 0.4447520 0.8066785 0.8612326 -1.0490034 1.3124758 -0.6314514 -0.7020124 0.0960365 77
0.8939790 -1.8706404 -1.7205738 -0.8639710 -0.5262797 -1.5667446 1.6894863 0.7929502 -0.7079900 81
0.7228241 -2.3712658 -0.6484062 -1.4390389 -0.6009545 -1.0432500 1.6894863 3.5899770 -0.7079900 80
0.4660916 1.4460028 0.1940113 1.4363005 0.8925419 0.0473638 0.7554504 0.2142550 1.0340673 79
0.3377254 0.5073302 -0.3420725 0.8612326 -0.3769300 -0.9560009 0.5573216 -0.3644402 0.2300409 80
1.1079227 -0.0558734 0.6535117 0.8612326 0.9672167 1.5305986 0.2176721 -0.4608894 -0.5739855 79
0.9367677 -0.7442333 -1.4908236 -0.8639710 -0.7503041 -1.3922464 1.5196616 -0.4608894 -0.7079900 79
-0.3468944 0.2570175 0.7300951 0.2861647 0.8178670 0.1346129 0.1610639 -0.6055632 -0.5739855 79
0.2949367 0.5073302 0.6535117 2.0113684 -0.8996538 1.6178477 -0.7163637 -0.6537878 -0.1719723 78
-1.7589228 -1.4951713 -0.1889057 -0.8639710 -1.1236782 0.7017321 -0.1502814 -0.7020124 -0.5739855 77
0.2521479 0.4447520 -0.8781564 -0.2889031 -0.1529056 0.5272339 0.6139298 -0.7020124 -0.4399811 85
0.1237817 -0.3061861 -0.3420725 -0.2889031 -0.4516048 -0.9996255 0.5573216 0.4071534 -0.4399811 81
-0.9887255 -0.9945460 1.0364287 -0.8639710 -0.3769300 -0.3452572 0.5007133 0.9858486 -0.4399811 81
-1.4166129 1.0079556 -1.7205738 0.8612326 -0.4516048 0.4836093 -0.6314514 -0.3644402 -0.3059767 80
-1.6305566 -0.5564988 -0.2654891 -0.2889031 1.4152656 -0.1707590 0.1610639 -0.6537878 -0.7079900 78
0.4660916 0.8202211 -0.2654891 1.4363005 -0.4516048 -0.3452572 -0.3484102 -0.1715418 -0.4399811 78
-0.0901620 0.6324865 1.1130121 -2.0141068 -0.6009545 1.4869740 0.6422339 -0.5091140 -0.5739855 77
0.0809930 -2.8093130 -1.7205738 -0.8639710 -0.3769300 -0.8251272 1.2649245 3.7346508 -0.7079900 86
-1.4594016 -1.8080622 -1.7205738 -0.2889031 -0.5262797 -1.4794955 -1.4805749 2.1432390 -0.5739855 85
0.4233029 0.9453774 -0.3420725 0.8612326 -0.3022552 0.9634794 -0.0653690 -0.6055632 -0.7079900 80
1.7925425 0.6950647 0.0408445 0.2861647 -0.6756293 1.9232196 0.2176721 0.3589288 1.3020762 79
1.1079227 0.3195957 0.9598453 0.2861647 0.9672167 1.7923459 -0.0087608 -0.5091140 -0.5739855 79
1.7925425 0.5699084 0.0408445 0.2861647 1.1165663 1.8359704 -0.1785855 -0.4126648 1.1680718 78
0.2949367 0.4447520 -0.7249896 0.2861647 -0.7503041 -0.2580081 0.8403627 -0.5573386 -0.5739855 78
-0.0901620 -0.1184515 1.7256794 0.2861647 1.0418915 1.8359704 -1.3107502 -0.5091140 -0.5739855 77
-1.8445003 -1.0571241 -1.0313232 -0.8639710 -0.6009545 -1.2177482 0.7837545 1.1787470 -0.3059767 77
-1.5449791 -1.3700150 0.2705947 -0.2889031 -0.7503041 -0.1271344 0.3874969 1.8538914 -0.5739855 84
-0.7747818 -2.0583749 -0.2654891 -2.5891747 0.7431922 0.0037392 0.2459763 3.3488540 4.7861908 83
-0.1329507 -0.6190769 1.6490960 -0.8639710 -1.1236782 1.7923459 1.2649245 -0.6537878 -0.0379679 82
0.0809930 0.5073302 0.8066785 -0.8639710 -0.6756293 1.5742231 0.2742804 -0.4126648 -0.7079900 81
-0.7747818 0.3195957 0.0408445 0.8612326 0.8925419 -0.4761308 -0.6880596 -0.3162156 -0.4399811 81
3.7608245 -2.7467348 -0.4952393 -0.2889031 -0.3769300 -0.7378781 -1.0560132 3.5417524 -0.4399811 80
0.0809930 -2.0583749 0.1174279 -0.2889031 -0.6009545 -0.3452572 0.7554504 1.1787470 -0.7079900 79
0.2949367 -0.8693896 1.5725126 -1.4390389 -0.7503041 1.1816022 -0.5465390 2.1432390 0.2300409 79
-1.0315142 -0.0558734 0.3471781 0.2861647 -0.3769300 0.9634794 -0.9427967 -0.6055632 -0.4399811 77
0.0809930 -1.2448587 1.4193458 -2.0141068 -0.7503041 -0.3016326 -0.4899308 -0.6537878 -0.5739855 77
0.0809930 -1.1822805 0.1940113 -1.4390389 -0.4516048 -0.6506290 0.0478474 3.4453032 -0.5739855 77
0.1665704 1.0079556 0.7300951 0.8612326 1.5646152 -0.7815027 -0.3484102 -0.5091140 -0.1719723 77
1.1935002 0.1318611 -0.4952393 -0.2889031 0.6685174 -1.0432500 -0.3484102 -0.7020124 -0.4399811 77
-0.2185282 -0.9319678 -0.4952393 -0.8639710 -0.3022552 -1.6539937 0.4724092 -0.3644402 -0.4399811 84
0.0382042 0.2570175 0.9598453 0.2861647 -1.0490034 1.6614722 -0.1785855 0.0213566 -0.4399811 81
1.7925425 1.0079556 -0.8781564 0.2861647 -0.3022552 -0.4325063 -0.4899308 0.0695812 1.1680718 81
0.6800353 -1.1822805 -1.1079066 -0.2889031 2.0126641 -1.5231201 -0.8295802 -0.1715418 -0.5739855 81
0.2521479 1.3834246 -0.1123223 1.4363005 -0.2275804 -0.2143835 -0.0936732 0.2624796 -0.7079900 80
0.0809930 -1.4951713 0.9598453 -0.2889031 1.5646152 -0.7815027 0.3308886 0.6965010 -0.4399811 80
-0.2185282 0.7576429 0.1940113 -0.2889031 0.0711189 0.7889812 0.3025845 -0.6537878 -0.1719723 79
-0.5608381 -0.7442333 1.1895955 -0.2889031 -0.8996538 0.3091111 -0.3201061 -0.2197664 -0.5739855 79
-0.4324719 -1.1197023 -0.1123223 -0.8639710 -0.0035559 -0.4325063 -0.0087608 1.2269716 -0.5739855 79
-1.1170917 -1.1197023 0.6535117 -0.2889031 -0.6756293 0.1782375 -0.1785855 -0.3162156 0.0960365 78
1.3646551 0.2570175 0.8066785 -0.8639710 -0.9743286 1.0507285 -0.5182349 -0.6055632 -0.1719723 78
-0.2613169 -0.2436079 0.8832619 -0.8639710 -0.1529056 -0.6070045 0.7271463 -0.6055632 0.2300409 78
-0.6892043 0.0067048 0.1174279 -0.8639710 -0.7503041 0.7453566 -0.7729720 -0.5091140 0.3640453 78
-0.7747818 -0.5564988 0.6535117 0.8612326 -0.5262797 -0.0398853 -1.2258379 1.1305224 -0.3059767 78
0.3377254 0.2570175 0.9598453 0.8612326 -0.5262797 1.2252267 0.5856257 -0.6055632 0.0960365 77
-0.9459368 -0.3687642 1.4193458 1.4363005 -0.6756293 1.7923459 0.2459763 2.0950144 -0.0379679 87
-1.5021904 0.6950647 -1.5674070 0.8612326 -0.5262797 1.0071040 -1.0560132 -0.1715418 3.4461467 84
0.6800353 -0.9319678 1.5725126 0.2861647 -0.6756293 1.4869740 -0.5182349 -0.6537878 -0.5739855 83
-0.7747818 0.0692830 0.0408445 0.2861647 -0.3022552 -1.5667446 -0.5465390 -0.2679910 -0.7079900 82
-1.0743030 0.6950647 0.9598453 -0.2889031 -0.8249789 0.3091111 1.2366204 1.3716454 0.4980497 81
1.2790777 -0.3061861 -0.6484062 0.2861647 -0.5262797 -0.6942536 -0.4616267 -0.0750926 -0.0379679 81
-0.6892043 -0.5564988 0.3471781 0.2861647 -0.4516048 0.0909884 -0.4333226 0.5036026 -0.3059767 80
-0.7747818 -1.1197023 -0.0357389 -0.8639710 -0.6756293 0.0037392 -0.1219773 -0.1233172 -0.1719723 79
-0.6464156 -0.3687642 -0.2654891 -0.8639710 -0.0782308 -0.0398853 0.6422339 -0.5573386 -0.5739855 78
-0.0045845 -1.6203277 0.7300951 -1.4390389 1.1165663 -1.8721165 0.5573216 -0.4126648 -0.3059767 78
0.0809930 1.2582683 1.3427624 1.4363005 -0.1529056 0.0909884 -0.1785855 -0.2679910 -0.5739855 78
-1.9300778 -1.9957967 -0.4952393 -0.8639710 -0.5262797 -0.8687518 0.1327598 1.6609930 -0.7079900 77
0.2521479 0.9453774 1.1130121 1.4363005 -0.7503041 0.3091111 -0.2918020 -0.5091140 0.3640453 86
-0.5608381 -1.1822805 0.9598453 0.8612326 -0.3769300 -0.5633799 -0.2351937 0.1660304 -0.7079900 85
-0.0901620 -0.0558734 0.8832619 -0.2889031 -0.2275804 0.3963602 -0.4616267 -0.7020124 -0.1719723 84
0.1665704 0.6324865 1.7256794 0.8612326 -0.7503041 1.9668441 -0.7446679 1.1305224 -0.3059767 84
0.1665704 0.0067048 0.4237615 -0.2889031 0.8178670 -0.7378781 -0.5465390 -0.5091140 -0.3059767 84
0.9367677 0.0067048 -0.1889057 -0.8639710 -0.3769300 -0.1271344 -0.4050184 -0.4126648 -0.1719723 83
0.2949367 0.5699084 0.5769283 0.2861647 0.1457937 -0.9123764 -0.7729720 -0.1233172 -0.5739855 83
-0.7747818 -0.6190769 1.0364287 0.2861647 -0.3769300 1.2252267 -0.4616267 -0.1715418 -0.4399811 83
-0.0045845 -0.2436079 -0.4186559 -0.2889031 1.1912411 -0.0398853 0.3591927 -0.6537878 -0.4399811 82
2.3060074 -0.4313424 -1.0313232 0.2861647 -0.7503041 -0.6070045 -0.0370649 -0.3644402 -0.4399811 82
-0.6464156 0.0067048 0.1174279 0.8612326 -0.7503041 0.9634794 -1.4239667 -0.5573386 -0.4399811 82
1.0223452 0.3195957 -0.9547398 1.4363005 -0.0782308 -0.9996255 -1.2541420 0.5518272 0.0960365 82
-1.1170917 0.6324865 0.0408445 0.8612326 -0.0782308 0.1346129 -0.4050184 0.2142550 -0.4399811 80
-0.3468944 -0.6816551 1.1895955 0.2861647 -0.6009545 -0.4325063 -0.7163637 -0.3162156 -0.3059767 79
-0.5608381 0.9453774 0.4237615 0.2861647 -1.1983530 1.7923459 -0.2351937 -0.6055632 0.2300409 78
-0.5608381 0.1318611 0.7300951 0.2861647 0.8925419 0.4399848 -1.0277090 -0.7020124 -0.5739855 77
-0.3041057 -0.1810297 -1.2610734 0.2861647 -0.7503041 -0.7378781 -0.6597555 0.0213566 0.0960365 93
0.3805141 -1.5577495 -0.1889057 -0.2889031 -0.3022552 -0.0835099 -1.1692296 1.9985652 -0.7079900 92
-0.7747818 0.5073302 0.3471781 1.4363005 -0.0782308 0.6581075 -0.3201061 0.2142550 -0.4399811 90
1.7069650 -0.5564988 0.5769283 1.4363005 -0.4516048 -0.4761308 -1.4805749 -0.1715418 0.3640453 90
-0.9459368 1.1331119 0.1940113 1.4363005 4.1782338 -0.1707590 -1.8768326 0.7447256 -0.3059767 90
-1.0315142 -1.3074368 1.1895955 -0.2889031 -0.8996538 1.5305986 -1.1975337 -0.1715418 -0.1719723 87
-1.0743030 -0.3687642 -0.0357389 -1.4390389 -0.7503041 0.1346129 -0.7729720 -0.1233172 1.4360806 86
0.8511903 0.1318611 1.4959292 0.2861647 -0.5262797 1.8795950 -1.2541420 -0.5091140 -0.3059767 86
-0.8175705 1.0705337 0.1940113 1.4363005 -0.7503041 -1.2613728 -1.3390543 0.7447256 0.9000629 86
-0.6892043 -0.1810297 1.1895955 0.2861647 -0.6756293 1.3997249 -0.1502814 -0.2679910 -0.5739855 85
0.7656128 -1.9332186 1.9554296 -2.0141068 -0.8996538 -0.7378781 -0.7729720 -0.4608894 0.6320541 84
0.2949367 0.9453774 0.6535117 0.2861647 -0.5262797 0.0037392 -2.1881779 1.6127684 -0.7079900 82
0.2949367 -1.0571241 1.5725126 -0.8639710 -0.3769300 -0.3016326 0.3025845 2.0467898 -0.4399811 81
-0.6036269 -0.0558734 -0.4186559 -0.2889031 1.1912411 0.0037392 -0.2068896 0.2624796 -0.7079900 81
1.4502326 0.6950647 0.5769283 0.2861647 1.3405908 1.7050968 -1.7919202 -0.0750926 -0.4399811 78
-0.9459368 0.5699084 1.4959292 0.8612326 -0.8996538 0.4399848 -0.3484102 0.0695812 1.9720982 77

K-Fold Cross Validation

K-fold cross-validation helps guard against imbalanced splits and ensures that each training fold is an accurate representation of the overall data. We will again stratify on the response variable, pop.

set.seed(100)

# Creating folds
spotify_folds <- vfold_cv(spotify_train, v = 5, strata = pop)

With 5-fold cross-validation and a little over 600 observations in total, each fold will contain approximately 120 observations. This should suffice given the size of the data set.
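As a quick sanity check, we can count the analysis and assessment rows in each fold. This is a small sketch that assumes the spotify_folds object created above; analysis() and assessment() come from rsample, and map_dfr() from purrr.

```r
library(rsample)
library(purrr)

# Count the rows each fold uses for fitting vs. evaluation
map_dfr(spotify_folds$splits, function(s) {
  data.frame(analysis   = nrow(analysis(s)),
             assessment = nrow(assessment(s)))
})
```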

Model Building

Linear Regression, Ridge Regression, K-Nearest Neighbors, Elastic Net, Random Forest, Support Vector Machine, and Boosted Trees are the models I will be examining.

For this process, we will follow these steps to fit them:

  1. Set up each model, tune its parameters, specify the engine, and set the mode to regression.
# Linear Regression
lm_model <- linear_reg() %>%
  set_engine("lm")

# Ridge Regression
# Tuning penalty and letting mixture be set to 0 
ridge_model <- linear_reg(mixture = 0,
                          penalty = tune()) %>%
  set_mode("regression") %>%
  set_engine("glmnet")

# K-Nearest Neighbors
# Tuning neighbors
knn_model <- nearest_neighbor(neighbors = tune()) %>%
  set_mode("regression") %>%
  set_engine("kknn")

# Elastic Net
# Tuning penalty and mixture
en_model <- linear_reg(penalty = tune(),
                       mixture = tune()) %>%
  set_mode("regression") %>%
  set_engine("glmnet")

# Random Forest
# Tuning mtry, trees, and min_n
rf_model <- rand_forest(mtry = tune(), 
                       trees = tune(), 
                       min_n = tune()) %>% 
  set_engine("ranger", importance = "impurity") %>% 
  set_mode("regression")

# Support Vector Machine
# Tuning cost and letting degree be set to 1
svm_model <- svm_poly(degree = 1, cost = tune()) %>%
  set_engine("kernlab") %>%
  set_mode("regression")

# Boosted Trees
# Tuning mtry, trees, and learn_rate
bt_model <- boost_tree(mtry = tune(), 
                       trees = tune(), 
                       learn_rate = tune()) %>%
  set_engine("xgboost") %>% 
  set_mode("regression")
  2. Set up workflows for the models.
# Linear Regression
lm_workflow <- workflow() %>%
  add_model(lm_model) %>%
  add_recipe(spotify_recipe)

# Ridge Regression
ridge_workflow <- workflow() %>%
  add_model(ridge_model) %>%
  add_recipe(spotify_recipe)

# K-Nearest Neighbors
knn_workflow <- workflow() %>%
  add_model(knn_model) %>%
  add_recipe(spotify_recipe)

# Elastic Net
en_workflow <- workflow() %>%
  add_model(en_model) %>%
  add_recipe(spotify_recipe)

# Random Forest
rf_workflow <- workflow() %>%
  add_model(rf_model) %>%
  add_recipe(spotify_recipe)

# Support Vector Model
svm_workflow <- workflow() %>% 
  add_model(svm_model) %>%
  add_recipe(spotify_recipe)

# Boosted Trees
bt_workflow <- workflow() %>% 
  add_model(bt_model) %>% 
  add_recipe(spotify_recipe)
  3. Create a tuning grid and specify the number of levels for each model.
# Linear Regression (N/A because there is no tuning parameters)

# Ridge Regression
# Note: penalty() uses a log10 scale, so this range searches 10^0.1 to 10^5
ridge_grid <- grid_regular(penalty(range = c(0.1, 5)), levels = 50)

# K-Nearest Neighbors
knn_grid <- grid_regular(neighbors(range = c(1,15)), levels = 5)

# Elastic Net
en_grid <- grid_regular(penalty(range = c(0.1, 5)),
                        mixture(range = c(0, 1)),
                        levels = 10)

# Random Forest
rf_grid <- grid_regular(mtry(range = c(1, 9)),
                        trees(range = c(200, 1000)),
                        min_n(range = c(5, 20)),
                        levels = 8)

# Support Vector Machine
svm_grid <- grid_regular(cost(), levels = 8)

# Boosted Trees
bt_grid <- grid_regular(mtry(range = c(1, 6)), 
                        trees(range = c(200, 600)),
                        learn_rate(range = c(-10, -1)),
                        levels = 5)
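As a side note on how levels works, grid_regular() can be called on its own to inspect the candidate values it generates. A minimal sketch using the same neighbors() range as above:

```r
library(dials)

# 5 evenly spaced candidate values for neighbors between 1 and 15
grid_regular(neighbors(range = c(1, 15)), levels = 5)
```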
  4. Tune each model by passing its workflow, the k-fold resamples, and the tuning grid to tune_grid().
# Linear Regression (N/A because there is no tuning parameters)

# Ridge Regression
ridge_tune <- tune_grid(ridge_workflow,
                        resamples = spotify_folds,
                        grid = ridge_grid)

# K-Nearest Neighbors
knn_tune <- tune_grid(knn_workflow,
                      resamples = spotify_folds,
                      grid = knn_grid)

# Elastic Net
en_tune <- tune_grid(en_workflow,
                     resamples = spotify_folds,
                     grid = en_grid)

# Random Forest
rf_tune <- tune_grid(rf_workflow,
                     resamples = spotify_folds,
                     grid = rf_grid)

# Support Vector Machine
svm_tune <- tune_grid(svm_workflow,
                      resamples = spotify_folds,
                      grid = svm_grid)

# Boosted Trees
bt_tune <- tune_grid(bt_workflow, 
                    resamples = spotify_folds, 
                    grid = bt_grid)
  5. Save the tuned results to .rda files and load them back in.
# Linear Regression (N/A because there is no tuning parameters)

# Ridge Regression
save(ridge_tune, file = "ridge_tune.rda")


# K-Nearest Neighbors
save(knn_tune, file = "knn_tune.rda")


# Elastic Net
save(en_tune, file = "en_tune.rda")


# Random Forest
save(rf_tune, file = "rf_tune.rda")

# Support Vector Machine
save(svm_tune, file = "svm_tune.rda")

# Boosted Trees
save(bt_tune, file = "bt_tune.rda")
load("ridge_tune.rda")
load("knn_tune.rda")
load("en_tune.rda")
load("rf_tune.rda")
load("svm_tune.rda")
load("bt_tune.rda")
  6. Collect the metrics and see which models have the lowest RMSE.
# Linear Regression
lm_fit <- fit_resamples(lm_workflow, resamples = spotify_folds)

(lm_rmse <- show_best(lm_fit, metric = "rmse"))
# Ridge Regression
(ridge_rmse <- show_best(ridge_tune, metric = "rmse", n = 1))
# K-Nearest Neighbor
(knn_rmse <- show_best(knn_tune, metric = "rmse", n = 1))
# Elastic Net
(en_rmse <- show_best(en_tune, metric = "rmse", n = 1))
# Random Forest
(rf_rmse <- show_best(rf_tune, metric = "rmse", n = 1))
# Support Vector Machine
(svm_rmse <- show_best(svm_tune, metric = "rmse", n = 1))
# Boosted Trees
(bt_rmse <- show_best(bt_tune, metric = "rmse", n = 1))

Model Results

Let’s find out which one of our models performed the best! We can find this by looking at the lowest RMSE value.

# Tibble comparing all of the RMSE values
tibble_rmse <- tibble(
  Model = c("Linear Regression", "Ridge Regression", "K-Nearest Neighbors",
            "Elastic Net", "Random Forest", "Support Vector Machine",
            "Boosted Trees"),
  RMSE = c(lm_rmse$mean, ridge_rmse$mean, knn_rmse$mean, en_rmse$mean,
           rf_rmse$mean, svm_rmse$mean, bt_rmse$mean)
) %>%
  arrange(RMSE)

tibble_rmse

From this table, we can tell that K-Nearest Neighbors performed the best because it had the lowest RMSE. The next three best models were Random Forest, Ridge Regression, and Elastic Net.

Model Autoplots

Autoplots are a great way to see how each model performed and how tuning the parameters influenced the models. For regression problems, we can judge each model by its RMSE: models with a low RMSE perform better than models with a high RMSE.

K-Nearest Neighbors

autoplot(knn_tune, metric = "rmse")

The K-Nearest Neighbors model was tuned over 5 levels. This plot shows that the model does better with more nearest neighbors: the RMSE is quite high with 1 neighbor and decreases as more neighbors are added. From this graph, we can see that the best choice is 15 neighbors.

Random Forest Autoplot

autoplot(rf_tune, metric = "rmse")

The Random Forest model has mtry, trees, and min_n tuned over 8 levels. From the plot, the number of trees does not have much of an impact on the model: regardless of the number, the curves all follow the same increasing line with a slight curve. The minimal node size also does not have a large impact, since those panels look the same as well. However, the number of randomly selected predictors does influence the RMSE: the fewer predictors used, the lower the RMSE.

Ridge Regression Autoplot

autoplot(ridge_tune, metric = "rmse")

The Ridge Regression model has penalty tuned, with mixture set to 0. In this plot, we can see that as the amount of regularization changes, the RMSE decreases pretty quickly and then increases again, ending with a stable line.

Best Model Result

Performance on the Folds

Let’s take a look at our best model.

(knn_rmse <- show_best(knn_tune, metric = "rmse", n = 1))

Let’s give a Grammy to KNN Model 5 with 15 neighbors, which had a mean RMSE of 14.16711, the best of all the models!

knitr::include_graphics("images/harrystyles.png")

Fitting to Training Data

Since KNN Model 5 is our best model, we will now fit it to our training data.

final_knn <- select_best(knn_tune, metric = "rmse")

final_knn_workflow <- finalize_workflow(knn_workflow, final_knn)

final_knn_fit <- fit(final_knn_workflow, data = spotify_train)

save(final_knn_fit, file = "final_knn_fit.rda")
load("final_knn_fit.rda")

Testing the Model

Now that it has been fitted, we will use it on our testing data to see how well our model performed!

spotify_tibble <- predict(final_knn_fit,
                          new_data = spotify_test %>% select(-pop))
spotify_tibble <- bind_cols(spotify_tibble, spotify_test %>% select(pop))

spotify_tibble
final_knn_test <- augment(final_knn_fit, spotify_test)
rmse(final_knn_test, truth = pop, .pred)

Our model didn’t perform as well on the testing data as it did on the training data, but the RMSE scores were similar! To recall, the RMSE on our training set was 14.16711, and on the testing set it is 15.12975. It is not the best, but it did a decent job.

This is a scatterplot of the actual popularity values in the testing set versus the model-predicted values:

spotify_tibble %>% 
  ggplot(aes(x = .pred, y = pop)) +
  geom_point(alpha = 0.5) +
  geom_abline(lty = 2) + 
  coord_obs_pred() +
  theme_minimal()

When we look at the plot, we can see that our model did not do the best job at predicting the popularity values, but it did not do the worst either. A few points fall on the line, which is good, but the majority do not. One reason the model is not more accurate could be that the data is slightly imbalanced, even after k-fold cross-validation. If the testing data were imbalanced, the KNN model would be biased toward the majority value. KNN is also prone to overfitting or underfitting the data, which may be part of the issue as well.

Conclusion

By exploring the data, building several machine learning models, and analyzing the outcomes, we found that the best model for predicting the popularity of Spotify songs is K-Nearest Neighbors. However, this model did not work perfectly and could be improved.

As for improvements, I should have considered including the year and creating subsets of the data based on it. Music is always changing, and trends shift from year to year, so the attributes that make a song popular could vary. By including the year, a stratified sample could be obtained that more accurately represents the commonalities among songs across years.
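One way to sketch this idea, assuming the raw data keeps a year column (as in the Kaggle file): stratify the initial split on year instead of dropping it, so every year is proportionally represented in both sets. The object name spotify here stands in for the full cleaned data set.

```r
library(rsample)

set.seed(100)

# Stratify the train/test split on year so each era is represented
spotify_split_yr <- initial_split(spotify, prop = 0.8, strata = year)
spotify_train_yr <- training(spotify_split_yr)
spotify_test_yr  <- testing(spotify_split_yr)
```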

Another improvement would be creating a more elaborate recipe and using a more diverse dataset. With the majority of the dataset having similar popularity scores, the outcome may be biased and stratification more difficult, which in turn makes the model harder to train and its predictions less accurate.
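A richer recipe could be sketched along these lines; the predictor names (dnce, nrgy, val) follow the Kaggle data set and are assumptions here, not tested choices:

```r
library(recipes)

spotify_recipe_v2 <- recipe(pop ~ ., data = spotify_train) %>%
  step_normalize(all_numeric_predictors()) %>%
  # Hypothetical interaction between danceability and energy
  step_interact(terms = ~ dnce:nrgy) %>%
  # Allow a curved relationship between valence and popularity
  step_poly(val, degree = 2)
```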

In general, my next step would be to continue working on this project and hopefully implement the improvements mentioned above. It would be super cool and useful to lower the RMSE and have a model that comes really close to predicting popularity scores with recent data. Overall, this was a great opportunity to work on my machine learning skills, and I found myself extremely passionate about creating more projects that involve machine learning. Hopefully, I can use these skills to make more projects and models that are worth a Grammy in my book!

knitr::include_graphics("images/beyonce.png")

Sources

The data set was called Top Spotify Songs From 2010-2019 By Year by Leonardo Henrique and it was retrieved from Kaggle.

Information in the introduction was from prior knowledge and from the Spotify site.